MPE Inference in Conditional Linear Gaussian Networks
Abstract
Given evidence on a set of variables in a Bayesian network, the most probable explanation (MPE) problem consists of finding a configuration of the remaining variables with maximum posterior probability. This problem has previously been addressed for discrete Bayesian networks and can be solved using inference methods similar to those used for finding posterior probabilities. However, for hybrid Bayesian networks such as conditional linear Gaussian (CLG) networks, the MPE problem has received little attention. In this paper, we provide insights into the general problem of finding an MPE configuration in a CLG network. To solve this problem, we devise an algorithm based on bucket elimination with the same computational complexity as that of calculating posterior marginals in a CLG network. We illustrate the workings of the algorithm using a detailed numerical example, and discuss possible extensions of the algorithm for handling the more general problem of finding a maximum a posteriori (MAP) hypothesis.
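To give a concrete feel for the kind of max-product ("bucket") elimination the abstract refers to, the sketch below runs MPE inference on a small, purely discrete toy network A -> B -> C with evidence C = 1. All variable names, domains and probability values are invented for illustration, and the sketch deliberately omits the Gaussian part of a CLG network, so it is not the paper's algorithm, only the discrete max-product skeleton it builds on.

```python
# Minimal sketch of MPE inference by max-product bucket elimination on a
# purely discrete toy network A -> B -> C with evidence C = 1.
# All names and numbers below are made up for illustration.
from itertools import product

# Domains of the (binary) toy variables.
domains = {"A": [0, 1], "B": [0, 1], "C": [0, 1]}

def tabulate(variables, fn):
    """Build a factor (scope, table) by evaluating fn on every joint assignment."""
    return (variables,
            {vals: fn(dict(zip(variables, vals)))
             for vals in product(*(domains[v] for v in variables))})

# Made-up CPTs: P(A), P(B|A), P(C|B).
p_a = tabulate(["A"], lambda x: 0.6 if x["A"] == 0 else 0.4)
p_b = tabulate(["A", "B"], lambda x: 0.7 if x["A"] == x["B"] else 0.3)
p_c = tabulate(["B", "C"], lambda x: 0.8 if x["B"] == x["C"] else 0.2)

def reduce_evidence(factor, evidence):
    """Restrict a factor to rows consistent with the evidence and drop the observed variables."""
    variables, table = factor
    keep = [v for v in variables if v not in evidence]
    new_table = {}
    for vals, p in table.items():
        assignment = dict(zip(variables, vals))
        if all(assignment[v] == e for v, e in evidence.items() if v in assignment):
            new_table[tuple(assignment[v] for v in keep)] = p
    return (keep, new_table)

def multiply(f1, f2):
    """Pointwise product of two factors over the union of their scopes."""
    vars1, t1 = f1
    vars2, t2 = f2
    variables = vars1 + [v for v in vars2 if v not in vars1]
    table = {}
    for vals in product(*(domains[v] for v in variables)):
        a = dict(zip(variables, vals))
        table[vals] = t1[tuple(a[v] for v in vars1)] * t2[tuple(a[v] for v in vars2)]
    return (variables, table)

def max_out(factor, var):
    """Max-marginalize var out of a factor and record the argmax needed for backtracking."""
    variables, table = factor
    keep = [v for v in variables if v != var]
    best, argmax = {}, {}
    for vals, p in table.items():
        a = dict(zip(variables, vals))
        key = tuple(a[v] for v in keep)
        if key not in best or p > best[key]:
            best[key], argmax[key] = p, a[var]
    return (keep, best), (keep, argmax)

def mpe(factors, evidence, elimination_order):
    """Bucket elimination with max instead of sum, followed by backtracking."""
    factors = [reduce_evidence(f, evidence) for f in factors]
    argmax_tables = []
    for var in elimination_order:
        bucket = [f for f in factors if var in f[0]]
        rest = [f for f in factors if var not in f[0]]
        combined = bucket[0]
        for f in bucket[1:]:
            combined = multiply(combined, f)
        reduced, argmax = max_out(combined, var)
        argmax_tables.append((var, argmax))
        factors = rest + [reduced]
    # Backtrack in reverse elimination order to recover the full configuration.
    config = {}
    for var, (keep, table) in reversed(argmax_tables):
        config[var] = table[tuple(config[v] for v in keep)]
    return config

print(mpe([p_a, p_b, p_c], {"C": 1}, ["A", "B"]))  # {'B': 1, 'A': 1}
```

With these toy numbers the script prints {'B': 1, 'A': 1}, the configuration maximizing P(A, B | C = 1); the argmax tables recorded during elimination are what allow the backtracking step to return a full configuration rather than only the maximum value.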
Similar Papers
Exact Inference on Conditional Linear Γ-Gaussian Bayesian Networks
Exact inference for Bayesian Networks is only possible for quite limited classes of networks. Examples of such classes are discrete networks, conditional linear Gaussian networks, networks using mixtures of truncated exponentials, and networks with densities expressed as truncated polynomials. This paper defines another class with exact inference, based on the normal inverse gamma conjugacy. We...
Operations for inference in continuous Bayesian networks with linear deterministic variables
An important class of continuous Bayesian networks consists of those with linear conditionally deterministic variables (a variable that is a linear deterministic function of its parents). In this case, the joint density function for the variables in the network does not exist. Conditional linear Gaussian (CLG) distributions can handle such cases when all variables are normally distributed. In this...
Dynamic Bayesian Networks with Deterministic Latent Tables
The application of latent/hidden variable Dynamic Bayesian Networks is constrained by the complexity of marginalising over latent variables. For this reason either small latent dimensions or Gaussian latent conditional tables linearly dependent on past states are typically considered in order that inference is tractable. We suggest an alternative approach in which the latent variables are model...
Modeling Nonlinear Deterministic Relationships in Bayesian Networks
In a Bayesian network with continuous variables, one or more of which is a conditionally deterministic function of its continuous parents, the joint density function for the variables in the network does not exist. Conditional linear Gaussian distributions can handle such cases when the deterministic function is linear and the continuous variables have a multivariate normal distribution...
Inference in Hybrid Bayesian Networks with Deterministic Variables
An important class of hybrid Bayesian networks consists of those with conditionally deterministic variables (a variable that is a deterministic function of its parents). In this case, if some of the parents are continuous, then the joint density function does not exist. Conditional linear Gaussian (CLG) distributions can handle such cases when the deterministic function is linear and continuous variables...